
    Principal Components and Factor Analysis. A Comparative Study.

    A comparison between Principal Component Analysis (PCA) and Factor Analysis (FA) is performed both theoretically and empirically for a random matrix X:(n x p), where n is the number of observations and both dimensions may be very large. The comparison surveys the asymptotic properties of the factor scores, of the singular values and of all the other elements involved, as well as the characteristics of the methods utilized for detecting the true dimension of X. In particular, the norms of the FA scores, whatever their number, and the norms of their covariance matrix are shown to be always smaller and to decay faster as n goes to infinity. As a result, the FA scores, when utilized as regressors and/or instruments, produce more efficient slope estimators in instrumental variable estimation. Moreover, as compared to PCA, the FA scores and factors exhibit a higher degree of consistency, because the difference between the estimates and their true counterparts is smaller, and so is the corresponding variance. Finally, FA usually selects a much smaller number of scores than PCA, greatly facilitating the search for, and identification of, the common components of X.
    Keywords: Principal Components, Factor Analysis, Matrix Norm
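    As a rough illustration of the norm comparison described above, the following sketch simulates a toy k-factor model (all sizes, seeds and the one-step principal-factoring recipe are illustrative assumptions, not the paper's setup) and contrasts PCA scores with regression-type principal-factor scores:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 20, 3          # observations, variables, true common factors

# Simulate a k-factor model: X = F @ L.T + E  (hypothetical toy data)
F = rng.standard_normal((n, k))
L = rng.standard_normal((p, k))
X = F @ L.T + 0.5 * rng.standard_normal((n, p))
Xc = X - X.mean(axis=0)

# PCA scores: left singular vectors scaled by the singular values
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_scores = U[:, :k] * s[:k]

# One-step principal-factor FA: replace the correlation diagonal with
# squared multiple correlations (communalities) before the eigendecomposition
R = np.corrcoef(Xc, rowvar=False)
communality = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
Rr = R.copy()
np.fill_diagonal(Rr, communality)
w, V = np.linalg.eigh(Rr)
idx = np.argsort(w)[::-1][:k]
loadings = V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
Z = Xc / Xc.std(axis=0)
fa_scores = Z @ loadings @ np.linalg.inv(loadings.T @ loadings)  # regression-type scores

# The abstract's claim, in miniature: FA score norms are smaller than PCA's
print(np.linalg.norm(fa_scores), np.linalg.norm(pca_scores))
```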

    Dynamic Econometric Testing of Climate Change and of its Causes

    The goal of this paper is to empirically test for structural breaks in world mean temperatures that may have triggered, at some date, the phenomenon known as “Climate Change” or “Global Warming”. Estimation by means of the dynamic Generalized Method of Moments is conducted on a large dataset spanning the recorded period from 1850 to the present, and different tests and selection procedures among competing model specifications are utilized, such as Principal Component and Principal Factor Analysis, instrument validity, and over-time changes in parameters and in the shares of both natural and anthropogenic forcings. The results of estimation unmistakably show no involvement of anthropogenic forcings and no occurrence of significant breaks in world mean temperatures. Hence the hypothesis of a climate change in the last 150 years, suggested by the advocates of Global Warming, is rejected. Pacific Decadal Oscillations, sunspots and the major volcanic eruptions account for the lion’s share in determining world temperatures, the first being a dimmer and the others substantial warmers.
    Keywords: Generalized Method of Moments, Global Warming, Principal Component and Factor Analysis, Structural Breaks
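    The GMM machinery the abstract relies on can be illustrated, in its simplest linear two-step form, on simulated data (the variables, coefficients and seed below are invented for illustration and have nothing to do with the climate dataset):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Toy setup: one endogenous regressor x, two valid instruments in z
z = rng.standard_normal((n, 2))
u = rng.standard_normal(n)
x = z @ np.array([1.0, -0.5]) + 0.8 * u + rng.standard_normal(n)
y = 2.0 * x + u                      # true slope = 2; x is correlated with u

X = x[:, None]
Z = z

def gmm(y, X, Z):
    """Two-step (efficient) GMM for the linear moment E[z (y - x b)] = 0."""
    W = np.linalg.inv(Z.T @ Z / len(y))                  # first-step weight
    b1 = np.linalg.solve(X.T @ Z @ W @ Z.T @ X, X.T @ Z @ W @ Z.T @ y)
    e = y - X @ b1
    S = (Z * e[:, None]).T @ (Z * e[:, None]) / len(y)   # moment covariance
    W2 = np.linalg.inv(S)                                # optimal weight
    return np.linalg.solve(X.T @ Z @ W2 @ Z.T @ X, X.T @ Z @ W2 @ Z.T @ y)

b_gmm = gmm(y, X, Z)
print(b_gmm)     # close to the true slope of 2, despite the endogeneity of x
```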

    Property Crime and Law Enforcement in Italy. A Regional Panel Analysis 1980-95

    In this paper a Cobb-Douglas utility function is introduced and solved for a dynamic equation of property crime supply and its determinants, namely deterrents and income. Thereafter, all variables are empirically tested, by means of a simultaneous equations model, for the sign and magnitude of their mutual relationships in a panel of Italy and its two economically and culturally different areas, the North and the South. The period scrutinized is 1980-95, and the results obtained differ widely between the two areas. When appropriately modeled and instrumented, in fact, property crime is found to react to police and criminal justice deterrence, and also to incomes, with different parameter magnitudes and significance levels. The same diversity applies to the parameters related to deterrence, flawed in quite a few cases by weak law enforcement and productivity, and to those related to local incomes, which for the South still reflect a tendency of crime to substitute for legal activities.
    Keywords: Models with Panel Data, Illegal Behavior and the Enforcement of Law
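    The panel dimension of the exercise can be sketched with a fixed-effects (within) regression on a toy log-linear crime supply (regions, years, coefficients and noise levels below are all hypothetical, not the Italian data):

```python
import numpy as np

rng = np.random.default_rng(2)
R, T = 20, 16        # regions, years (the paper's panel spans 1980-95)

# Toy log-linear crime supply: crime = a_r - 0.6*deterrence + 0.3*income + e
a = rng.standard_normal(R)                    # regional fixed effects
det = rng.standard_normal((R, T))
inc = rng.standard_normal((R, T))
crime = a[:, None] - 0.6 * det + 0.3 * inc + 0.2 * rng.standard_normal((R, T))

def within(M):
    """Within (fixed-effects) transformation: demean each region's series."""
    return M - M.mean(axis=1, keepdims=True)

y = within(crime).ravel()
X = np.column_stack([within(det).ravel(), within(inc).ravel()])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta)          # recovers roughly [-0.6, 0.3] despite the fixed effects
```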

    Dynamic GMM Estimation With Structural Breaks. An Application to Global Warming and its Causes.

    In this paper I propose a nonstandard t-test statistic for detecting level and trend breaks in I(0) series. Theoretical and limit-distribution critical values obtained from Monte Carlo experimentation are supplied. The null hypothesis of anthropogenic versus natural causes of global warming is then tested for the period 1850-2006 by means of a dynamic GMM model which incorporates the null of breaks of anthropogenic origin. World average temperatures are found to have been tapering off for some decades now, and to exhibit no significant breaks attributable to human activities. While these play a minor causative role in climate changes, most natural forcings, and in particular solar sunspots, are major warmers. Finally, in contrast to widely held opinions, greenhouse gases are in general temperature dimmers.
    Keywords: Generalized Method of Moments, Multiple Breaks, Principal Component Analysis, Global Warming
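    The Monte Carlo derivation of critical values for a sup-type break t-test can be sketched as follows (the sample size, trimming fraction and replication count are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)
T, reps = 150, 500

def max_break_t(y):
    """Largest |t| of a level-shift dummy over interior candidate break dates."""
    best = 0.0
    t_idx = np.arange(len(y))
    for tb in range(15, len(y) - 15):              # trim 10% at each end
        d = (t_idx >= tb).astype(float)
        Xb = np.column_stack([np.ones(len(y)), d])
        b = np.linalg.lstsq(Xb, y, rcond=None)[0]
        e = y - Xb @ b
        s2 = e @ e / (len(y) - 2)
        cov = s2 * np.linalg.inv(Xb.T @ Xb)
        best = max(best, abs(b[1]) / np.sqrt(cov[1, 1]))
    return best

# Null distribution of the sup-|t| statistic under an I(0) (white-noise)
# series; its 95th percentile is the Monte Carlo critical value
null = np.array([max_break_t(rng.standard_normal(T)) for _ in range(reps)])
crit95 = np.quantile(null, 0.95)
print(round(crit95, 2))   # well above the pointwise 1.96 threshold
```

Because the maximum is taken over many candidate break dates, the correct critical value exceeds the standard normal one, which is why tabulated Monte Carlo values are needed.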

    The U.S. Dynamic Taylor Rule With Multiple Breaks, 1984-2001.

    This paper combines two major strands of literature: structural breaks and Taylor rules. First, I propose a nonstandard t-test statistic for detecting multiple level and trend breaks in I(0) series, supplying theoretical and limit-distribution critical values obtained from Monte Carlo experimentation. Thereafter, I introduce a forward-looking Taylor rule expressed as a dynamic model which allows for multiple breaks and reaction-function coefficients of the leads of inflation, of the output gap and of an equity market index. Sequential GMM estimation of the model, applied to the Effective Federal Funds Rate for the period 1984:01-2001:06, produces three main results: the existence of significant structural breaks, the substantial role played by inflation in the FOMC decisions, and a marked equity-targeting policy approach. These results reveal departures from rationality, determined by structured and unstructured uncertainty, which the Fed systematically attempts to reduce by administering inflation scares and misinformation about the actual Phillips curve, in order to keep the output and equity markets under control.
    Keywords: Generalized Method of Moments; Monetary Policy Rules; Multiple Breaks
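    For readers unfamiliar with Taylor rules, a textbook forward-looking rule (with illustrative coefficients, not the paper's estimates) maps expected inflation and the output gap into a policy rate:

```python
def taylor_rate(pi_expected, output_gap, r_star=2.0, pi_target=2.0,
                a=0.5, b=0.5):
    """Textbook Taylor rule: i = r* + pi_e + a*(pi_e - pi*) + b*gap.
    All coefficients here are the classic illustrative 0.5/0.5 values."""
    return r_star + pi_expected + a * (pi_expected - pi_target) + b * output_gap

# Expected inflation of 3% with a 1% positive output gap
print(taylor_rate(3.0, 1.0))   # -> 6.0
```

The paper's version augments this with lead terms, an equity-market index and break dummies, all estimated by sequential GMM rather than fixed at textbook values.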

    Supervised Principal Components and Factor Instrumental Variables. An Application to Violent Crime Trends in the US, 1982-2005.

    Supervised Principal Component Analysis (SPCA) and Factor Instrumental Variables (FIV) are competing methods aimed at estimating models affected by regressor collinearity and at selecting a reduced-size instrument set from a large database, possibly dominated by non-exogeneity and weakness. While the first method stresses the role of the regressors by taking account of their data-induced tie with the endogenous variable, the second places absolute relevance on the data-induced structure of the covariance matrix and selects the true common factors as instruments by means of formal statistical procedures. Theoretical analysis and Monte Carlo simulations demonstrate that FIV is more efficient than SPCA and standard Generalized Method of Moments (GMM) even when the instruments are few and possibly weak. The preferred FIV estimation is then applied to a large dataset to test the more recent theories on the determinants of total violent crime and homicide trends in the United States for the period 1982-2005. Demographic variables, and especially abortion, law enforcement and unchecked gun availability, are found to be the most significant determinants.
    Keywords: Principal Components; Instrumental Variables; Generalized Method of Moments; Crime; Law and Order
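    The factor-instrument idea can be sketched in its simplest PCA-based form (a hypothetical setup; the paper's FIV procedure involves formal factor-selection tests not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, k = 1000, 30, 2       # observations, raw instruments, retained factors

# Toy data: many noisy instruments driven by k common factors
Fc = rng.standard_normal((n, k))
Z_raw = Fc @ rng.standard_normal((k, m)) + rng.standard_normal((n, m))
u = rng.standard_normal(n)
x = Fc @ np.array([1.0, 1.0]) + 0.7 * u + rng.standard_normal(n)
y = 1.5 * x + u                               # true slope = 1.5

# Extract k factor scores from the large instrument set and use them
# as a small, strong instrument block
Zc = Z_raw - Z_raw.mean(axis=0)
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
Zf = U[:, :k] * s[:k]

# Simple 2SLS with the factor instruments
x_hat = Zf @ np.linalg.lstsq(Zf, x, rcond=None)[0]
beta = (x_hat @ y) / (x_hat @ x)
print(beta)     # close to 1.5, using only k=2 instruments instead of m=30
```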

    Climate change: where is the hockey stick? Evidence from millennial-scale reconstructed and updated temperature time series.

    The goal of this paper is to test, on a millennial scale, the magnitude of the recent warm period, known as the “hockey stick”, and the relevance of the causative anthropogenic climate change hypothesis advanced by several academics and worldwide institutions. A select batch of ten long-term climate proxies included in the NOAA 92 PCN dataset, all of which run well into the nineties, is updated to the year 2011 by means of a Time-Varying Parameter Kalman Filter SISO model for state prediction. This procedure is applied by appropriately selecting as observable one of the HADSST2 and HADCRUT3 series of instrumental temperature anomalies, available since the year 1850. The updated proxy series are thereafter individually tested for the values and time locations of their four maximum non-neighboring attained temperatures. The results are at best inconclusive, since three of the updated series, including Michael Mann’s celebrated and controversial tree-ring reconstructions, do not refute the hypothesis, while the others quite significantly point to different dates of maximum temperature achievement in past centuries, in particular those associated with the Medieval Warm Period.
    Keywords: Climate Change; Hockey Stick Controversy; Time Series; Kalman Filter
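    A one-dimensional local-level Kalman filter, the simplest relative of the Time-Varying Parameter filter used in the paper, can be sketched as follows (toy random-walk signal and hypothetical noise variances, not the proxy data):

```python
import numpy as np

def kalman_local_level(y, q=0.01, r=1.0):
    """Local-level Kalman filter: state x_t = x_{t-1} + w_t (Var q),
    observation y_t = x_t + v_t (Var r). Returns the filtered states."""
    x, P = y[0], 1.0
    out = []
    for obs in y:
        P = P + q                    # predict: state variance grows by q
        K = P / (P + r)              # Kalman gain
        x = x + K * (obs - x)        # update toward the new observation
        P = (1.0 - K) * P
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(5)
true = np.cumsum(0.1 * rng.standard_normal(400))   # slow random-walk signal
y = true + rng.standard_normal(400)                # noise-ridden observations
filtered = kalman_local_level(y, q=0.01, r=1.0)

# The filtered state tracks the signal with much less error than y itself
print(np.mean((filtered - true) ** 2) < np.mean((y - true) ** 2))
```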

    Signal Optimal Smoothing by Means of Spectral Analysis

    This chapter introduces two new empirical methods for obtaining optimal smoothing of noise-ridden stationary and nonstationary, linear and nonlinear signals. Both methods utilize an application of the spectral representation theorem (SRT) for signal decomposition that exploits the dynamic properties of optimal control. The methods, named SRT1 and SRT2, produce a low-resolution and a high-resolution filter, respectively, which may be utilized as optimal long- and short-run tracking as well as forecasting devices. Monte Carlo simulation applied to three broad classes of signals enables comparison of the dual SRT methods with a similarly optimized version of the well-known and reputed empirical Hilbert-Huang transform (HHT). The results point to a more satisfactory performance of the SRT methods, especially the second, in terms of low and high resolution as compared to the HHT for all three signal classes, and in many cases also for nonlinear and stationary/nonstationary signals. Finally, all three methods undergo statistical experiments on eight select real-time data sets, which include climatic, seismological, economic and solar time series.
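    A crude spectral smoother in the spirit of the chapter, though far simpler than SRT1/SRT2, keeps only the lowest frequency bins of the FFT (the cutoff and the test signal are illustrative assumptions):

```python
import numpy as np

def fft_lowpass(y, keep=10):
    """Zero out all but the lowest `keep` real-FFT frequency bins,
    then invert back to the time domain."""
    Y = np.fft.rfft(y)
    Y[keep:] = 0.0
    return np.fft.irfft(Y, n=len(y))

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
signal = np.sin(2 * np.pi * 3 * t)              # low-frequency component
y = signal + 0.5 * rng.standard_normal(512)     # noise-ridden observation
smooth = fft_lowpass(y, keep=8)

# Low-pass smoothing recovers the signal far better than the raw series
print(np.mean((smooth - signal) ** 2) < np.mean((y - signal) ** 2))
```

Choosing the cutoff optimally, rather than fixing it a priori as here, is precisely the kind of problem the chapter's optimal-control machinery addresses.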

    Testing the hockey-stick hypothesis by statistical analyses of a large dataset of proxy records.

    This paper is a statistical time-series investigation aimed at testing the anthropogenic climate change hypothesis known as the “hockey stick”. The time-series components of a select batch of 258 long-term yearly Climate Change Proxies (CCP), included in 19 paleoclimate datasets and running back as far as the year 2192 B.C., are reconstructed by means of univariate Bayesian Calibration. The instrumental temperature record utilized is the Global Best Estimated Anomaly (BEA) of the HADCRUT4 time series, available yearly for the period 1850-2010. After performing appropriate data transformations, Ordinary Least Squares parameter estimates are obtained and subsequently simulated by means of multi-draw Gibbs sampling for each year of the pre-1850 period. The ensuing Time-Varying Parameter sequence is utilized to produce high-resolution calibrated estimates of the CCP series, merged with BEA to yield Millennial-scale Time Series (MTS). Finally, the MTS are individually tested for a single temperature break date and for multiple peak dates. The estimated temperature breaks and peaks suggest widespread rejection of the hockey-stick hypothesis, since they are mostly centered in the Medieval Warm Period.
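    The Gibbs-sampling step of a univariate Bayesian calibration can be sketched for a toy proxy-on-temperature regression (the coefficients, sample size and flat-prior setup are invented for illustration; this is not the paper's 258-proxy procedure):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 161                     # e.g. one draw per year of an 1850-2010 overlap

# Toy calibration model: proxy = 0.5 + 2.0 * temperature + noise
temp = rng.standard_normal(n)
proxy = 0.5 + 2.0 * temp + 0.3 * rng.standard_normal(n)

# Gibbs sampler for (beta, sigma^2) under a flat prior on beta
X = np.column_stack([np.ones(n), temp])
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ proxy         # OLS estimate, as in the paper
draws = []
sigma2 = 1.0
for _ in range(2000):
    # beta | sigma2, data  ~  N(beta_hat, sigma2 * (X'X)^-1)
    beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    # sigma2 | beta, data  ~  scaled inverse chi-square
    e = proxy - X @ beta
    sigma2 = (e @ e) / rng.chisquare(n)
    draws.append(beta)

post_mean = np.mean(draws[500:], axis=0)   # discard burn-in draws
print(post_mean)                           # close to the true [0.5, 2.0]
```

Inverting the fitted relation then yields calibrated temperature estimates, with the multi-draw simulation propagating parameter uncertainty into the reconstruction.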